Modulation of LIP activity by predictive auditory and visual cues.

Authors

  • Yale E Cohen
  • Ian S Cohen
  • Gordon W Gifford
Abstract

The lateral intraparietal area (area LIP) contains a multimodal representation of extra-personal space. To further examine this representation, we trained rhesus monkeys on the predictive-cueing task. During this task, monkeys shifted their gaze to a visual target whose location was predicted by the location of an auditory or visual cue. We found that, when the sensory cue was at the same location as the visual target, the monkeys' mean saccadic latency was faster than when the sensory cue and the visual target were at different locations. This difference in mean saccadic latency was the same for both auditory cues and visual cues. Despite the fact that the monkeys used auditory and visual cues in a similar fashion, LIP neurons responded more to visual cues than to auditory cues. This modality-dependent activity was also seen during auditory and visual memory-guided saccades but to a significantly greater extent than during the predictive-cueing task. Additionally, we found that the firing rate of LIP neurons was inversely correlated with saccadic latency. This study indicates further that modality-dependent differences in LIP activity do not simply reflect differences in sensory processing but also reflect the cognitive and behavioral requirements of a task.


Similar articles

Responses to auditory stimuli in macaque lateral intraparietal area. II. Behavioral modulation.

The lateral intraparietal area (LIP), a region of posterior parietal cortex, was once thought to be unresponsive to auditory stimulation. However, recent reports have indicated that neurons in area LIP respond to auditory stimuli during an auditory-saccade task. To what extent are auditory responses in area LIP dependent on the performance of an auditory-saccade task? To address this question, ...


The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception

Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt from manual tactile contact with the speaker's face. Given the temporal precedence of the haptic and visual signals on the acoustic signal in these studies, the observed modulation of N1/P2 a...


Viseme comparison based on phonetic cues for varying speech accents

Human interaction through speech is a multisensory activity, wherein the spoken audio is perceived using both auditory and visual cues. However, in the absence of auditory stimulus, speech content can be perceived through lip reading, using the dynamics of the social context. In our earlier work [1], we had presented a tool enabling the hearing impaired to understand spoken speech in videos, throug...


The use of auditory and visual context in speech perception by listeners with normal hearing and listeners with cochlear implants

There is a wide range of acoustic and visual variability across different talkers and different speaking contexts. Listeners with normal hearing (NH) accommodate that variability in ways that facilitate efficient perception, but it is not known whether listeners with cochlear implants (CIs) can do the same. In this study, listeners with NH and listeners with CIs were tested for accommodation to...


Impact of cued speech on audio-visual speech integration in deaf and hearing adults

For hearing and deaf people, speech perception involves an integrative process between auditory and lip-read information. In order to disambiguate information from the lips, manual cues may be added (Cued Speech). We examined how audio-visual integration is affected by the presence of manual cues. To address this issue, we designed an original experiment using audio-visual McGurk stimuli produced wi...



Journal:
  • Cerebral cortex

Volume 14, Issue 12

Pages: -

Publication year: 2004